New ν-Support Vector Machines and their Sequential Minimal Optimization
Authors
Abstract
Although the ν-Support Vector Machine, ν-SVM (Schölkopf et al., 2000), has the advantage of using a single parameter ν to control both the number of support vectors and the fraction of margin errors, two issues prevent it from being used in many real-world applications. First, unlike the C-SVM, which allows asymmetric misclassification costs, the ν-SVM uses a symmetric misclassification cost. While this symmetric cost promotes a low overall error rate, that is not always the preferred measure in many applications. Second, the additional constraint in the ν-SVM makes its training more difficult. Sequential Minimal Optimization (SMO) algorithms, which are very easy to implement and scalable to very large problems, do not exist in a good form for the ν-SVM. In this paper, we propose two new ν-SVM formulations. These formulations introduce a means to control the misclassification cost ratio between false positives and false negatives, while preserving the intuitive parameter ν. We also propose an SMO algorithm for the ν-SVM classification problem. Experiments show that our new ν-SVM formulation is effective in incorporating asymmetric misclassification costs, and that the SMO algorithm for the ν-SVM is comparable in speed to that for the C-SVM.
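The asymmetric-cost idea can be illustrated with a simple stand-in: a linear SVM trained on a class-weighted hinge loss, where the per-class penalties `c_pos` and `c_neg` play the role of the false-negative/false-positive cost ratio. This is a minimal numpy sketch under assumed names, not the paper's ν-formulation; it uses plain subgradient descent rather than SMO.

```python
import numpy as np

def train_linear_svm(X, y, c_pos=1.0, c_neg=1.0, lam=0.01, lr=0.1, epochs=200):
    """Linear SVM via subgradient descent on a class-weighted hinge loss.

    Errors on positive examples are penalized by c_pos, on negatives by
    c_neg, so c_pos / c_neg acts as a misclassification cost ratio
    (an illustrative stand-in for the paper's cost-ratio parameter).
    """
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    cost = np.where(y > 0, c_pos, c_neg)     # per-sample penalty weight
    for _ in range(epochs):
        margins = y * (X @ w + b)
        viol = margins < 1                   # margin violators drive the update
        grad_w = lam * w - (cost[viol] * y[viol]) @ X[viol] / n
        grad_b = -np.sum(cost[viol] * y[viol]) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b
```

Raising `c_pos` above `c_neg` shifts the decision boundary toward the negative class, trading false positives for fewer false negatives.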
Similar papers
Single Directional SMO Algorithm for Least Squares Support Vector Machines
Working set selection is a major step in decomposition methods for training least squares support vector machines (LS-SVMs). In this paper, a new technique for selecting the working set in sequential-minimal-optimization- (SMO-) type decomposition methods is proposed. With the new method, a single direction can be selected to achieve convergence to the optimality condition. A simple asymptot...
Full text
Sequential Minimal Optimization: A Fast Algorithm for Training Support Vector Machines
This paper proposes a new algorithm for training support vector machines: Sequential Minimal Optimization, or SMO. Training a support vector machine requires the solution of a very large quadratic programming (QP) optimization problem. SMO breaks this large QP problem into a series of smallest possible QP problems. These small QP problems are solved analytically, which avoids using a time-consu...
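For intuition, the pairwise decomposition described above can be sketched in the spirit of Platt's simplified SMO for the standard C-SVM with a linear kernel. Function names and toy parameters here are our own; the sketch omits Platt's second-choice heuristic and the extra equality-constraint handling that the ν-SVM requires.

```python
import numpy as np

def simplified_smo(X, y, C=1.0, tol=1e-3, max_sweeps=200):
    """Simplified SMO: repeatedly pick a pair of multipliers that violates
    the KKT conditions, solve the two-variable QP analytically, repeat."""
    n = len(y)
    alpha = np.zeros(n)
    b = 0.0
    K = X @ X.T                                  # linear kernel, precomputed
    f = lambda k: (alpha * y) @ K[:, k] + b      # decision value on point k
    for _ in range(max_sweeps):
        changed = 0
        for i in range(n):
            E_i = f(i) - y[i]
            # only touch multipliers that violate the KKT conditions
            if (y[i] * E_i < -tol and alpha[i] < C) or (y[i] * E_i > tol and alpha[i] > 0):
                j = (i + 1) % n                  # crude second choice (Platt uses a heuristic)
                E_j = f(j) - y[j]
                a_i, a_j = alpha[i], alpha[j]
                # box [L, H] keeps the pair feasible for sum(alpha * y) = 0
                if y[i] != y[j]:
                    L, H = max(0.0, a_j - a_i), min(C, C + a_j - a_i)
                else:
                    L, H = max(0.0, a_i + a_j - C), min(C, a_i + a_j)
                eta = 2 * K[i, j] - K[i, i] - K[j, j]
                if L == H or eta >= 0:
                    continue
                # analytic solution of the two-variable subproblem, clipped to the box
                alpha[j] = np.clip(a_j - y[j] * (E_i - E_j) / eta, L, H)
                if abs(alpha[j] - a_j) < 1e-5:
                    continue
                alpha[i] = a_i + y[i] * y[j] * (a_j - alpha[j])
                # update threshold b from whichever multiplier is strictly inside (0, C)
                b1 = b - E_i - y[i] * (alpha[i] - a_i) * K[i, i] - y[j] * (alpha[j] - a_j) * K[i, j]
                b2 = b - E_j - y[i] * (alpha[i] - a_i) * K[i, j] - y[j] * (alpha[j] - a_j) * K[j, j]
                b = b1 if 0 < alpha[i] < C else (b2 if 0 < alpha[j] < C else (b1 + b2) / 2)
                changed += 1
        if changed == 0:                         # a full pass with no updates: done
            break
    w = (alpha * y) @ X                          # explicit weight vector (linear kernel only)
    return w, b
```

Each inner step is a closed-form update of just two multipliers, which is what makes SMO cheap per iteration and easy to implement.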
Full text
Max-Margin Learning of Gaussian Mixtures with Sequential Minimal Optimization
This work deals with discriminative training of Gaussian Mixture Models through margin maximization. Going one step further than previous work, we propose a new formulation of the learning problem that allows the use of efficient optimization algorithms popularized for Support Vector Machines, yielding improved convergence properties and recognition accuracy on handwritten digit recognition.
Full text
Fast Linear Optimisation with Automatically Biased Support Vector Machines
We propose a new Support Vector Machine classifier formulation which allows for automatic computation of the bias and eliminates the equality constraint. We also present a new training algorithm capable of providing fast training for our automatically biased SVM. We then show that this method allows for the application of acceleration methods which further increase the rates of co...
Full text
A Study on SMO-type Decomposition Methods for Support Vector Machines
Decomposition methods are currently one of the major methods for training support vector machines. They vary mainly according to different working set selections. Existing implementations and analysis usually consider some specific selection rules. This paper studies sequential minimal optimization type decomposition methods under a general and flexible way of choosing the two-element working s...
Full text